Additive white Gaussian noise : Wikipedia (English edition)
Additive white Gaussian noise
Additive white Gaussian noise (AWGN) is a basic noise model used in information theory to mimic the effect of many random processes that occur in nature. The modifiers denote specific characteristics:
* ''Additive'' because it is added to any noise that might be intrinsic to the information system.
* ''White'' refers to the idea that it has uniform power across the frequency band of the information system. It is an analogy to the color white, which has uniform emission at all frequencies in the visible spectrum.
* ''Gaussian'' because it has a normal distribution in the time domain with an average time domain value of zero.
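The three properties can be illustrated with a short simulation. The sketch below is a minimal example, not from the article; the signal and the noise power of 0.25 are arbitrary assumptions. It adds independent zero-mean Gaussian samples to a signal and checks that the sample mean and variance of the noise come out as expected:

```python
import random
import statistics

def awgn(signal, noise_power, rng=random):
    """Add zero-mean Gaussian noise of variance noise_power to each sample."""
    sigma = noise_power ** 0.5
    return [x + rng.gauss(0.0, sigma) for x in signal]

random.seed(0)                          # reproducible run
clean = [1.0] * 10000                   # arbitrary constant signal
noisy = awgn(clean, noise_power=0.25)

# Recover the noise samples and inspect their statistics.
noise = [y - x for x, y in zip(clean, noisy)]
print(statistics.mean(noise))           # close to 0 (zero-mean)
print(statistics.variance(noise))       # close to 0.25 (the chosen variance)
```

With 10,000 samples the empirical mean and variance land close to the nominal 0 and 0.25, which is the "additive" and "Gaussian" part of the model in miniature; "white" (flat spectrum) follows from the samples being independent.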
Wideband noise comes from many natural sources, such as the thermal vibrations of atoms in conductors (referred to as thermal noise or Johnson-Nyquist noise), shot noise, black-body radiation from the Earth and other warm objects, and celestial sources such as the Sun. The central limit theorem of probability theory indicates that the sum of many independent random processes tends toward a Gaussian, or normal, distribution.
AWGN is often used as a channel model in which the only impairment to communication is a linear addition of wideband or white noise with a constant spectral density (expressed as watts per hertz of bandwidth) and a Gaussian distribution of amplitude. The model does not account for fading, frequency selectivity, interference, nonlinearity or dispersion. However, it produces simple and tractable mathematical models which are useful for gaining insight into the underlying behavior of a system before these other phenomena are considered.
The AWGN channel is a good model for many satellite and deep space communication links. It is not a good model for most terrestrial links because of multipath, terrain blocking, interference, etc. However, for terrestrial path modeling, AWGN is commonly used to simulate background noise of the channel under study, in addition to multipath, terrain blocking, interference, ground clutter and self interference that modern radio systems encounter in terrestrial operation.
==Channel capacity==
The AWGN channel is represented by a series of outputs Y_i at discrete-time event index i. Y_i is the sum of the input X_i and noise Z_i, where Z_i is independent and identically distributed and drawn from a zero-mean normal distribution with variance N (the noise power). The Z_i are further assumed not to be correlated with the X_i.
:Z_i \sim \mathcal{N}(0, N)
:Y_i = X_i + Z_i \sim \mathcal{N}(X_i, N).
The capacity of the channel is infinite unless the noise N is nonzero and the X_i are sufficiently constrained. The most common constraint on the input is the so-called "power" constraint, requiring that for a codeword (x_1, x_2, \dots, x_k) transmitted through the channel, we have:
:\frac{1}{k}\sum_{i=1}^{k} x_i^2 \leq P,
where P represents the maximum channel power.
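As a quick numerical illustration of the constraint (the codeword and power values here are hypothetical, not from the article), the requirement is simply that the average squared amplitude over the k symbols of a codeword may not exceed P:

```python
def satisfies_power_constraint(codeword, P):
    """Check the power constraint (1/k) * sum(x_i^2) <= P."""
    k = len(codeword)
    return sum(x * x for x in codeword) / k <= P

# A BPSK-like codeword with average power exactly 1.0.
codeword = [1.0, -1.0, 1.0, -1.0]
print(satisfies_power_constraint(codeword, P=1.0))  # True
print(satisfies_power_constraint(codeword, P=0.5))  # False
```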
Therefore, the channel capacity for the power-constrained channel is given by:
:C = \max_{f(x) \,:\, E\left(X^2\right) \leq P} I(X;Y)
where f(x) is the distribution of X. Expanding I(X;Y) in terms of the differential entropy:
:\begin{align}
I(X;Y) &= h(Y) - h(Y|X) \\
&= h(Y) - h(X+Z|X) \\
&= h(Y) - h(Z|X)
\end{align}
But X and Z are independent, therefore:
:I(X;Y) = h(Y) - h(Z)
Evaluating the differential entropy of a Gaussian gives:
:h(Z) = \frac{1}{2} \log(2 \pi e N)
Because X and Z are independent and their sum gives Y:
:E(Y^2) = E\left((X+Z)^2\right) = E(X^2) + 2E(X)E(Z) + E(Z^2) = P + N
From this bound and the fact that the Gaussian distribution maximizes differential entropy for a given second moment, we infer that
:h(Y) \leq \frac{1}{2} \log\left(2 \pi e (P+N)\right)
Therefore the channel capacity is given by the highest achievable bound on the mutual information:
:I(X;Y) \leq \frac{1}{2} \log\left(2 \pi e (P+N)\right) - \frac{1}{2} \log(2 \pi e N)
where I(X;Y) is maximized when:
:X \sim \mathcal{N}(0, P)
Thus the channel capacity C for the AWGN channel is given by:
:C = \frac{1}{2} \log\left(1 + \frac{P}{N}\right)
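The capacity formula is easy to evaluate numerically. The sketch below (a minimal example; the P and N values are arbitrary, not from the article) uses log base 2, which gives capacity in bits per channel use; the natural log would give nats:

```python
import math

def awgn_capacity(P, N):
    """Capacity of the power-constrained AWGN channel, in bits per channel use."""
    return 0.5 * math.log2(1 + P / N)

print(awgn_capacity(P=1.0, N=1.0))   # SNR = 1 (0 dB):  0.5 bit per use
print(awgn_capacity(P=15.0, N=1.0))  # SNR = 15:        2.0 bits per use
```

Note the 1/2 factor: capacity here is per discrete channel use. For a continuous-time channel of bandwidth B, sampling at the Nyquist rate of 2B uses per second recovers the familiar C = B log2(1 + P/N) bits per second.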
